In quantum mechanics, information theory, and Fourier analysis, the '''entropic uncertainty''' or '''Hirschman uncertainty''' is defined as the sum of the temporal and spectral Shannon entropies. It turns out that Heisenberg's uncertainty principle can be expressed as a lower bound on the sum of these entropies. This is ''stronger'' than the usual statement of the uncertainty principle in terms of the product of standard deviations.

In 1957, Hirschman considered a function ''f'' and its Fourier transform ''g'' such that

:<math>g(y) \approx \int_{-\infty}^\infty \exp(-2\pi ixy)\, f(x)\, dx, \qquad f(x) \approx \int_{-\infty}^\infty \exp(2\pi ixy)\, g(y)\, dy,</math>

where the "≈" indicates convergence in ''L''<sup>2</sup>, and normalized so that (by Plancherel's theorem)

:<math>\int_{-\infty}^\infty |f(x)|^2\, dx = \int_{-\infty}^\infty |g(y)|^2\, dy = 1.</math>

He showed that for any such functions the sum of the Shannon entropies is non-negative,

:<math>H(|f|^2) + H(|g|^2) \equiv -\int_{-\infty}^\infty |f(x)|^2 \log |f(x)|^2\, dx - \int_{-\infty}^\infty |g(y)|^2 \log |g(y)|^2\, dy \ge 0.</math>

A tighter bound,

:<math>H(|f|^2) + H(|g|^2) \ge \log \frac{e}{2},</math>

was conjectured by Hirschman〔I. I. Hirschman, Jr. "A note on entropy." ''American Journal of Mathematics'' 79 (1957): 152–156.〕 and Everett,〔Hugh Everett, III. ''The Many-Worlds Interpretation of Quantum Mechanics: the Theory of the Universal Wave Function'' (Everett's dissertation).〕 proven in 1975 by W. Beckner, and in the same year interpreted as a generalized quantum mechanical uncertainty principle by Białynicki-Birula and Mycielski. The equality holds in the case of Gaussian distributions.

Note, however, that the above entropic uncertainty function is distinctly ''different'' from the quantum von Neumann entropy represented in phase space.

==Sketch of proof==
The proof of this tight inequality depends on the so-called (''q'',&nbsp;''p'')-norm of the Fourier transformation. (Establishing this norm is the most difficult part of the proof.) From this norm, one is able to establish a lower bound on the sum of the (differential) Rényi entropies, <math>H_\alpha(|f|^2) + H_\beta(|g|^2)</math>, where <math>\tfrac{1}{\alpha} + \tfrac{1}{\beta} = 2</math> and

:<math>H_\alpha(|f|^2) = \frac{1}{1-\alpha} \log \int_{-\infty}^\infty |f(x)|^{2\alpha}\, dx,</math>

which generalize the Shannon entropies (recovered in the limit <math>\alpha \to 1</math>). For simplicity, we consider this inequality only in one dimension; the extension to multiple dimensions is straightforward and can be found in the literature cited.
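As a short check of the equality case (not part of the proof itself, and with the Fourier convention fixed above), take the ''L''<sup>2</sup>-normalized Gaussian of width ''a''&nbsp;&gt;&nbsp;0:

:<math>f(x) = (2a)^{1/4} e^{-\pi a x^2}, \qquad g(y) = (2/a)^{1/4} e^{-\pi y^2/a},</math>

so that <math>|f|^2</math> and <math>|g|^2</math> are normal densities with variances <math>1/(4\pi a)</math> and <math>a/(4\pi)</math>, respectively. Using the entropy <math>H = \tfrac{1}{2}\log(2\pi e \sigma^2)</math> of a normal density with variance <math>\sigma^2</math>,

:<math>H(|f|^2) + H(|g|^2) = \tfrac{1}{2}\log\frac{e}{2a} + \tfrac{1}{2}\log\frac{ea}{2} = \log\frac{e}{2},</math>

so every Gaussian saturates the bound, independently of the width ''a''.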
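The same equality case can also be checked numerically. The following is a minimal sketch, assuming only NumPy; the grid size <code>N</code>, spacing <code>dx</code>, and width <code>a</code> are illustrative choices, and the FFT is used only to approximate the continuous transform defined above.

<syntaxhighlight lang="python">
import numpy as np

# Sample f on a symmetric grid; the DFT then approximates the continuous
# Fourier transform g(y) = \int exp(-2*pi*i*x*y) f(x) dx used by Hirschman.
N, dx = 4096, 0.01                     # illustrative grid parameters
x = (np.arange(N) - N // 2) * dx
y = np.fft.fftshift(np.fft.fftfreq(N, d=dx))
dy = y[1] - y[0]

a = 1.0                                # illustrative Gaussian width
f = (2 * a) ** 0.25 * np.exp(-np.pi * a * x**2)   # L2-normalized Gaussian
g = dx * np.fft.fftshift(np.fft.fft(np.fft.ifftshift(f)))

def shannon_entropy(density, step):
    """Differential Shannon entropy -\int p log p (natural logarithm)."""
    p = density[density > 0]           # drop underflowed zeros before log
    return -np.sum(p * np.log(p)) * step

H_sum = shannon_entropy(np.abs(f)**2, dx) + shannon_entropy(np.abs(g)**2, dy)
print(H_sum, np.log(np.e / 2))         # both ~0.3069 for the Gaussian
</syntaxhighlight>

For a Gaussian the two printed values agree up to discretization error; replacing <code>f</code> with any other normalized function that is well resolved on the grid should give a sum at or above <math>\log(e/2) \approx 0.3069</math>.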